# 100-billion parameter large model

**Stockmark 2 100B Instruct Beta** (MIT license)
Stockmark-2-100B is a 100-billion-parameter large language model focused on Japanese. It was pre-trained on 1.5 trillion tokens of multilingual data and further trained on Japanese synthetic data to improve instruction following.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: stockmark · Downloads: 1,004 · Likes: 9
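
Since the card tags the model as a Transformers model, here is a minimal loading sketch. The repository ID `stockmark/Stockmark-2-100B-Instruct-beta` and the chat-template usage are assumptions based on common Hugging Face conventions for instruct models, not confirmed by this page; note that a 100-billion-parameter model needs several high-memory GPUs (or aggressive quantization) to run.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stockmark/Stockmark-2-100B-Instruct-beta"  # assumed repo ID; verify on the model page
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory versus float32
    device_map="auto",           # shards weights across available GPUs
)

# One instruction-following turn; a Japanese prompt matches the model's focus.
messages = [{"role": "user", "content": "日本の首都はどこですか？"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```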
**Plamo 100b** (Other license)
A 100-billion-parameter model trained by Preferred Elements on open English-Japanese bilingual datasets, offered under both commercial and non-commercial licenses.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: pfnet · Downloads: 178 · Likes: 18
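
Loading this base model looks similar, with one likely difference: PLaMo repositories on Hugging Face may ship custom modeling code, which would require `trust_remote_code=True`. Both that flag and the repository ID `pfnet/plamo-100b` are assumptions; verify them on the publisher's page. Because this is a base (not instruct) model, the sketch uses plain text completion rather than a chat template.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pfnet/plamo-100b"  # assumed repo ID; verify on the model page
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # assumed: PLaMo repos may define custom model classes
)

# Plain completion: a base model simply continues the prompt.
inputs = tokenizer("日本で一番高い山は", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```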